
Conversation

@pytorchbot (Collaborator)

This PR was created by the merge bot to help merge the original PR into the main branch.
ghstack PR number: #12132 by @kimishpatel
^ Please use this as the source of truth for the PR details, comments, and reviews
ghstack PR base: https://github.com/pytorch/executorch/tree/gh/kimishpatel/196/base
ghstack PR head: https://github.com/pytorch/executorch/tree/gh/kimishpatel/196/head
Merge bot PR base: https://github.com/pytorch/executorch/tree/gh/kimishpatel/195/orig
Merge bot PR head: https://github.com/pytorch/executorch/tree/gh/kimishpatel/196/orig
@diff-train-skip-merge

…ssful

At the moment we continue execution and the stack fails later on, as I found when running with quantized KV cache + ring attention.
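
For illustration, a minimal fail-fast sketch of the pattern this fix describes (all names here are hypothetical stand-ins, not the actual ExecuTorch API): abort as soon as the update reports failure instead of continuing and crashing later in the stack.

```python
def update_cache(cache: dict, key: str, value: object) -> bool:
    """Hypothetical stand-in for the real cache update; False means failure."""
    if key is None:
        return False
    cache[key] = value
    return True


def apply_update(cache: dict, key: str, value: object) -> None:
    # Fail fast: before this fix, execution continued past a failed update
    # and the stack only failed much later (e.g. with quantized KV cache +
    # ring attention), far from the root cause.
    if not update_cache(cache, key, value):
        raise RuntimeError("KV cache update failed; aborting early")
```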

Differential Revision: [D77516822](https://our.internmc.facebook.com/intern/diff/D77516822/)

ghstack-source-id: 293635304
Pull Request resolved: #12129
Now that we support quantized SDPA, the query tensor can be quantized and the attention mask can be float (the only type allowed), so this check doesn't make sense anymore.
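
For illustration, a minimal sketch of what the relaxed check could look like (the function name and the exact accepted dtypes are assumptions, not the actual ExecuTorch code): only the attention mask's dtype is validated, since the query may now be quantized.

```python
import torch


def check_sdpa_inputs(query: torch.Tensor, attn_mask: torch.Tensor) -> None:
    # No dtype check on `query` anymore: with quantized SDPA the query may
    # be a quantized (e.g. int8) tensor rather than float.
    # The attention mask must still be a float tensor (the only type allowed).
    if attn_mask.dtype not in (torch.float16, torch.bfloat16, torch.float32):
        raise ValueError(f"attention mask must be float, got {attn_mask.dtype}")
```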

Differential Revision: [D77516821](https://our.internmc.facebook.com/intern/diff/D77516821/)

ghstack-source-id: 293661338
Pull Request resolved: #12131
… and sdpa

When using quantized KV cache and SDPA, there were two bugs:
1. It did not reset return_float_values of QuantizedRingKVCache, which results in QuantizedKVCache returning float values post-dequant.
2. For quantized KV cache, the SDPA module stores a kv_cache that is owned by the attention module. When replacing the KV cache in Attention, we have to make sure that we change the reference in SDPA as well (see the sketch below).
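
A rough sketch of both fixes, with illustrative names (`return_float_values` comes from the commit text; the function signature, the `kv_cache` attribute paths, and the assumption that `False` is the reset value are mine, not the actual ExecuTorch code):

```python
def replace_with_quantized_ring_kv_cache(attention, sdpa, ring_cache):
    # Fix 1: reset return_float_values so the QuantizedRingKVCache hands
    # back quantized values rather than dequantized floats.
    ring_cache.return_float_values = False

    # Fix 2: the SDPA module keeps its own reference to the cache owned by
    # the attention module; swapping the cache on `attention` alone would
    # leave SDPA pointing at the stale cache, so update both references.
    attention.kv_cache = ring_cache
    sdpa.kv_cache = ring_cache
```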

Differential Revision: [D77516823](https://our.internmc.facebook.com/intern/diff/D77516823/)

ghstack-source-id: 293661340
Pull Request resolved: #12132
@pytorch-bot (bot) commented on Jul 1, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/12143

Note: Links to docs will display an error until the docs builds have been completed.

❌ 1 New Failure, 2 Unrelated Failures

As of commit 2530b33 with merge base 9905026:

NEW FAILURE - The following job has failed:

FLAKY - The following job failed but was likely due to flakiness present on trunk:

BROKEN TRUNK - The following job failed but was present on the merge base:

👉 Rebase onto the `viable/strict` branch to avoid these failures

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot added the CLA Signed label on Jul 1, 2025
Base automatically changed from gh/kimishpatel/195/orig to main July 3, 2025 00:01
@github-actions (bot) commented on Jul 3, 2025

This PR needs a `release notes:` label

If your change should be included in the release notes (i.e., would users of this library care about this change?), please use a label starting with `release notes:`. This helps us keep track of changes and include your work in the next release notes.

To add a label, comment to pytorchbot, for example:
`@pytorchbot label "release notes: none"`

For more information, see
https://github.com/pytorch/pytorch/wiki/PyTorch-AutoLabel-Bot#why-categorize-for-release-notes-and-how-does-it-work.

@kimishpatel merged commit f11e4d3 into main on Jul 3, 2025
99 of 102 checks passed
@kimishpatel deleted the gh/kimishpatel/196/orig branch on July 3, 2025 03:41
Tanish2101 pushed a commit to Tanish2101/executorch that referenced this pull request Jul 9, 2025
… and sdpa (pytorch#12143)

Co-authored-by: Kimish Patel <[email protected]>